Computing Gaussian Mixture Models with EM Using Equivalence Constraints
Authors
Abstract
Density estimation with Gaussian Mixture Models is a popular generative technique that is also widely used for clustering. We develop a framework to incorporate side information in the form of equivalence constraints into the model estimation procedure. Equivalence constraints are defined on pairs of data points, indicating whether the points arise from the same source (positive constraints) or from different sources (negative constraints). Such constraints can be gathered automatically in some learning problems, and are a natural form of supervision in others. For the estimation of model parameters we present a closed-form EM procedure that handles positive constraints, and a Generalized EM procedure using a Markov net that handles negative constraints. Using publicly available data sets, we demonstrate that such side information can lead to considerable improvement in clustering tasks, and that our algorithm is preferable to two other suggested methods that use the same type of side information.
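As a rough illustration of the positive-constraint case described in the abstract (a minimal sketch, not the authors' reference implementation), the Python below groups positively constrained points into "chunklets" that are assumed to share a single hidden source: the E-step computes one responsibility vector per chunklet, and the M-step lets every point inherit its chunklet's responsibilities. All function and variable names are illustrative, and an isotropic covariance is assumed for brevity.

```python
import numpy as np

def log_gauss(X, mu, var):
    """Log-density of an isotropic Gaussian N(mu, var*I) at each row of X."""
    d = X.shape[1]
    return -0.5 * (d * np.log(2 * np.pi * var) + ((X - mu) ** 2).sum(axis=1) / var)

def em_with_positive_constraints(X, chunklets, k, n_iter=100, seed=0):
    """X: (n, d) data. chunklets: list of index lists, each a group of points known
    to come from the same source; unconstrained points are passed as singletons."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, size=k, replace=False)].astype(float)  # initial means
    var = np.full(k, X.var())                                   # initial variances
    pi = np.full(k, 1.0 / k)                                    # mixing weights
    sizes = np.array([len(c) for c in chunklets], dtype=float)

    for _ in range(n_iter):
        # E-step: one responsibility vector per chunklet, since all of its points
        # share one hidden source: r_cj ∝ pi_j * prod_{i in c} N(x_i | mu_j, var_j).
        log_r = np.array([[np.log(pi[j]) + log_gauss(X[c], mu[j], var[j]).sum()
                           for j in range(k)] for c in chunklets])
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: every point inherits its chunklet's responsibilities.
        for j in range(k):
            w = r[:, j]
            tot = (w * sizes).sum()
            mu[j] = sum(w[ci] * X[c].sum(axis=0) for ci, c in enumerate(chunklets)) / tot
            var[j] = sum(w[ci] * ((X[c] - mu[j]) ** 2).sum()
                         for ci, c in enumerate(chunklets)) / (tot * d)
        pi = r.mean(axis=0)  # one "vote" per chunklet; per-point weighting is also possible
    return pi, mu, var, r

# Illustrative usage on synthetic data with a few positive constraints.
X = np.vstack([np.random.default_rng(1).normal(m, 0.3, (50, 2)) for m in (0.0, 3.0)])
constrained = {0, 1, 2, 50, 51}
chunklets = [[0, 1, 2], [50, 51]] + [[i] for i in range(X.shape[0]) if i not in constrained]
pi, mu, var, r = em_with_positive_constraints(X, chunklets, k=2)
```

With no constraints every chunklet is a singleton and the procedure reduces to ordinary EM; how chunklets are weighted in the mixing-proportion update (per chunklet or per point) is a modeling choice that depends on how chunklets are assumed to be sampled.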
Similar resources
Computing Gaussian Mixture Models with EM using Side-Information
Estimation of Gaussian mixture models is an efficient and popular technique for clustering and density estimation. An EM procedure is widely used to estimate the model parameters. In this paper we show how side information in the form of equivalence constraints can be incorporated into this procedure, leading to improved clustering results. Equivalence constraints are prior knowledge concerning...
IMAGE SEGMENTATION USING GAUSSIAN MIXTURE MODEL
Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models have a key role in probabilistic data analysis. In this paper, we have fitted a Gaussian mixture model to the pixels of an image. The parameters of the model have been estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image is made by Bayes' rule. In fact, ...
Image Segmentation using Gaussian Mixture Model
Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models have a key role in probabilistic data analysis. In this paper, we applied a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image was made by Bayes' rule. In fact,...
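Both segmentation abstracts above describe the same generic pipeline: fit a GMM to pixel values with EM, then label each pixel by its maximum-posterior (Bayes-rule) component. The short sketch below illustrates that pipeline on a synthetic grayscale image with scikit-learn; it is not the papers' own code, and the synthetic image and component count are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic grayscale "image": two noisy intensity regions standing in for real data.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(0.2, 0.05, (64, 128)),
                      rng.normal(0.8, 0.05, (64, 128))], axis=0)

pixels = img.reshape(-1, 1)                                        # one feature (intensity) per pixel
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)  # parameters estimated by EM
labels = gmm.predict(pixels)                                       # Bayes-rule (max posterior) labeling
segmentation = labels.reshape(img.shape)                           # per-pixel segment map
print(np.round(gmm.means_.ravel(), 2))                             # recovered intensity means
```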
Incremental Learning of Gaussian Mixture Models
Gaussian Mixture Modeling (GMM) is a parametric method for high dimensional density estimation. Incremental learning of GMM is very important in problems such as clustering of streaming data and robot localization in dynamic environments. Traditional GMM estimation algorithms like EM Clustering tend to be computationally very intensive in these scenarios. We present an incremental GMM estimatio...
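For concreteness, here is one generic way to update a GMM incrementally, by folding each new point into running soft counts and sufficient statistics (an online-EM style update). This is a hedged illustration only, not the specific algorithm proposed in the abstract above; the isotropic covariance and the approximate variance update are simplifying assumptions.

```python
import numpy as np

def gauss_pdf(x, mu, var):
    """Density of an isotropic Gaussian N(mu, var*I) at a single point x."""
    d = len(x)
    return np.exp(-0.5 * ((x - mu) ** 2).sum() / var) / (2 * np.pi * var) ** (d / 2)

class OnlineGMM:
    """Maintain running soft counts and update parameters one point at a time."""
    def __init__(self, means, var=1.0):
        self.mu = np.asarray(means, float)   # component means, shape (k, d)
        k = self.mu.shape[0]
        self.var = np.full(k, float(var))    # isotropic variance per component
        self.count = np.ones(k)              # running soft counts (start as pseudo-counts)

    def update(self, x):
        x = np.asarray(x, float)
        w = self.count / self.count.sum()    # current mixing weights
        # E-step for the new point only: posterior over components.
        r = np.array([wj * gauss_pdf(x, m, v) for wj, m, v in zip(w, self.mu, self.var)])
        r /= r.sum()
        # M-step: fold the point into each component's running sufficient statistics.
        self.count += r
        for j in range(len(r)):
            g = r[j] / self.count[j]         # per-component gain
            self.mu[j] += g * (x - self.mu[j])                      # exact running mean
            self.var[j] += g * (((x - self.mu[j]) ** 2).mean() - self.var[j])  # approx. variance

# Illustrative usage on a 1-D stream.
model = OnlineGMM(means=[[0.0], [5.0]], var=1.0)
for x in np.random.default_rng(0).normal(5.0, 0.5, 200):
    model.update([x])
print(np.round(model.mu.ravel(), 2), np.round(model.count, 1))
```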
Learning Bayesian Networks under Equivalence Constraints (Abstract)
Machine learning tasks typically assume that the examples of a given dataset are independent and identically distributed (i.i.d.). Yet, there are many domains and applications where this assumption does not strictly hold. Further, there may be additional information available that ties together the examples of a dataset, which we could exploit to learn more accurate models. For example, there a...